
    On estimating scale invariance in stratocumulus cloud fields

    Examination of cloud radiance fields derived from satellite observations sometimes indicates the existence of a range of scales over which the statistics of the field are scale invariant. Many methods have been developed in geophysics to quantify this scaling behavior. The usefulness of such techniques depends both on the physics of the process being robust over a wide range of scales and on the availability of high-resolution, low-noise observations over these scales. These techniques (the area-perimeter relation, the distribution of areas, estimation of the capacity dimension d0 through box counting, and the correlation exponent) are applied to the high-resolution satellite data taken during the FIRE experiment, and initial estimates of the quality of data required are obtained by analyzing simple sets. The results for the observed fields are contrasted with those for images of objects with known characteristics (e.g., dimension), where the details of the constructed image simulate current observational limits. Throughout, when cloud elements and cloud boundaries are mentioned, it should be clearly understood that structures in the radiance field are meant: all the boundaries considered are defined by simple threshold arguments.
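    The box-counting estimate of the capacity dimension d0 mentioned above can be made concrete with a short sketch. This is a minimal illustration, not the paper's code: the function name, the box sizes and the synthetic radiance field are all assumptions, and a thresholded random field is used only as a stand-in (its mask fills the plane, so the estimate comes out near d0 = 2; real cloud boundaries would give intermediate values).

        import numpy as np

        def box_count_dimension(mask, sizes=(2, 4, 8, 16, 32, 64)):
            # Estimate d0 of a binary cloud mask by counting occupied boxes
            # at several box sizes and fitting the slope of log N(s) versus
            # log s, since N(s) ~ s**(-d0).
            counts = []
            for s in sizes:
                # Trim the mask so it tiles exactly into s x s boxes.
                h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
                trimmed = mask[:h, :w]
                # A box is occupied if any pixel inside it is cloudy.
                boxes = trimmed.reshape(h // s, s, w // s, s).any(axis=(1, 3))
                counts.append(boxes.sum())
            slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
            return -slope

        # Cloud boundaries defined by a simple threshold argument, as in the
        # abstract; the radiance field here is synthetic, not FIRE data.
        radiance = np.random.rand(256, 256)
        cloud_mask = radiance > 0.5
        print(f"estimated d0 = {box_count_dimension(cloud_mask):.2f}")

    Running the same estimator on constructed images of known dimension, as the abstract describes, is what calibrates how resolution and noise bias the estimate.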

    Studies of Penetration of Phenol-Formaldehyde Resin into Wood Cell Walls with the SEM and Energy-Dispersive X-ray Analyzer

    The following technical note is offered as an extension of, and a rebuttal to, the article by Bernard M. Collett in the Summer 1970 issue of Wood and Fiber, 2(2): 113-133. We were impressed by the historical review and technical coverage given to SEM by Mr. Collett. However, we felt that readers of Wood and Fiber would be left with an erroneous, or at least incomplete, impression of the analytical capabilities of this instrument. Admittedly, accessories are required to accomplish what is described in the following preliminary note. Nevertheless, we were of the opinion that some indication of the potential of this type of instrumentation should have been given in the article. His diagram in Fig. 3 does include X-ray detection, but no mention of its use was made. Indeed, the emphasis placed on the secondary electron detection mode for studies of the interface between wood substrate and adhesive or coating was perhaps too great.

    Demonstrating the value of larger ensembles in forecasting physical systems

    Ensemble simulation propagates a collection of initial states forward in time in a Monte Carlo fashion. Depending on the fidelity of the model and the properties of the initial ensemble, the goal of ensemble simulation can range from merely quantifying variations in the sensitivity of the model all the way to providing actionable probability forecasts of the future. Whatever the goal, success depends on the properties of the ensemble, and there is a longstanding discussion in meteorology as to the size of initial-condition ensemble most appropriate for Numerical Weather Prediction. In terms of resource allocation: how is one to divide finite computing resources between model complexity, ensemble size, data assimilation and other components of the forecast system? One wishes to avoid undersampling the information available from the model’s dynamics, yet one also wishes to use the highest-fidelity model available. Arguably, a higher-fidelity model can better exploit a larger ensemble; nevertheless, it is often suggested that a relatively small ensemble, say ~16 members, is sufficient and that larger ensembles are not an effective investment of resources. This claim is shown to be dubious when the goal is probabilistic forecasting, even in settings where the forecast model is informative but imperfect. Probability forecasts for a ‘simple’ physical system are evaluated at different lead times; ensembles of up to 256 members are considered. The pure density-estimation context (where ensemble members are drawn from the same underlying distribution as the target) differs from the forecasting context, where one is given a high-fidelity (but imperfect) model. In the forecasting context, the information provided by additional members depends also on the fidelity of the model, the ensemble-formation scheme (data assimilation), the ensemble interpretation and the nature of the observational noise. The effect of increasing the ensemble size is quantified by its relative information content (in bits) using a proper skill score. Doubling the ensemble size is demonstrated to yield a non-trivial increase in the information content (forecast skill) even for ensembles with well over 16 members; this result holds in forecasting both a mathematical system and a physical system. Indeed, even at the largest ensemble sizes considered (128 and 256), there are lead times where the forecast information is still increasing with ensemble size. Ultimately, model error will limit the value of ever larger ensembles. No support is found, however, for limiting design studies to the sizes commonly found in seasonal and climate studies. It is suggested that ensemble size be considered more explicitly in future design studies of forecast systems on all time scales.
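    The "relative information content (in bits)" of a larger ensemble can be illustrated with the ignorance (logarithmic) score, a proper skill score. The sketch below is a toy, not the paper's experiment: the Gaussian kernel dressing, the kernel width and the standard-normal system are all assumed choices, and it works in the pure density-estimation context described above (members and target drawn from the same distribution).

        import numpy as np

        rng = np.random.default_rng(0)

        def ignorance(ensemble, outcome, sigma=0.3):
            # Forecast density at the verifying outcome from a Gaussian
            # kernel dressing of the ensemble; the score is -log2(density),
            # so lower is better and score differences are in bits.
            dens = np.mean(np.exp(-0.5 * ((outcome - ensemble) / sigma) ** 2)
                           / (sigma * np.sqrt(2.0 * np.pi)))
            return -np.log2(dens)

        n_forecasts = 2000
        small, large = 16, 256
        gain = 0.0
        for _ in range(n_forecasts):
            outcome = rng.standard_normal()
            gain += (ignorance(rng.standard_normal(small), outcome)
                     - ignorance(rng.standard_normal(large), outcome))

        print(f"mean gain of {large} over {small} members: "
              f"{gain / n_forecasts:.3f} bits")

    A positive mean gain means the 256-member ensemble assigns, on average, more probability to what actually happens; in the forecasting context the size of that gain would also depend on model fidelity, data assimilation and observational noise, as the abstract notes.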

    Escape from model-land

    Both mathematical modelling and simulation methods in general have contributed greatly to understanding, insight and forecasting in many fields, including macroeconomics. Nevertheless, we must remain careful to distinguish model-land and model-land quantities from the real world. Decisions taken in the real world are more robust when informed by estimation of real-world quantities with transparent uncertainty quantification than when based on “optimal” model-land quantities obtained from simulations of imperfect models, quantities optimized (perhaps even optimal) only in model-land. The authors present a short guide to some of the temptations and pitfalls of model-land, some directions towards the exit, and two ways to escape. Their aim is to improve decision support by providing relevant, adequate information regarding the real-world target of interest, or by making it clear why today’s models are not up to that task for the particular target of interest.

    The Need for Continued Development of Ricin Countermeasures

    Ricin toxin, an extremely potent and heat-stable toxin produced from the bean of the ubiquitous Ricinus communis (castor bean plant), has been categorized by the US Centers for Disease Control and Prevention (CDC) as a category B biothreat agent, one that is moderately easy to disseminate. Ricin has the potential to be used as an agent of biological warfare and bioterrorism; there is therefore a critical need for continued development of ricin countermeasures. A safe and effective prophylactic vaccine against ricin, FDA-approved for “at risk” individuals, would be an important first step in assuring the availability of medical countermeasures against ricin.

    An assessment of the foundational assumptions in high-resolution climate projections: the case of UKCP09

    The United Kingdom Climate Impacts Programme’s UKCP09 project makes high-resolution projections of the climate out to 2100 by post-processing the outputs of a large-scale global climate model. The aim of this paper is to describe and analyse the methodology used, and then to urge some caution. Given the acknowledged systematic, shared shortcomings of all current climate models, treating model outputs as decision-relevant projections can be significantly misleading. In extrapolatory situations, such as projections of future climate-change impacts, there is little reason to expect that post-processing of model outputs can correct for the consequences of such errors. This casts doubt on our ability, today, to make trustworthy, high-resolution probabilistic projections out to the end of this century.
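    For readers unfamiliar with the term, “post-processing” here means statistically adjusting model output against observations. The sketch below shows one generic, deliberately simple form (empirical quantile mapping); it is emphatically not the UKCP09 methodology, and the data are synthetic. It does, however, make the extrapolation worry concrete: the mapping is anchored to the historical calibration sample, so projected values beyond that range saturate at its extremes.

        import numpy as np

        def quantile_map(model_hist, obs_hist, model_future):
            # Map each projected value through the historical model CDF,
            # then read off the corresponding observed quantile.
            q = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
            return np.quantile(obs_hist, np.clip(q, 0.0, 1.0))

        rng = np.random.default_rng(1)
        obs_hist = rng.normal(10.0, 2.0, 500)      # observed historical climate
        model_hist = rng.normal(12.0, 3.0, 500)    # biased model, same period
        model_future = rng.normal(16.0, 3.0, 500)  # model projection (extrapolation)

        corrected = quantile_map(model_hist, obs_hist, model_future)
        # Projected values beyond the calibration range all collapse onto the
        # observed extremes: the correction cannot fix errors it never saw.
        print(f"corrected future mean: {corrected.mean():.1f}")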

    Ricin Perspective in Bioterrorism


    Multiwavelength Observations of GX 339-4 in 1996. III. Keck Spectroscopy

    As part of our multiwavelength campaign of observations of GX 339-4 in 1996, we present Keck spectroscopy performed on May 12 UT. At this time, neither the ASM on RXTE nor BATSE on CGRO detected the source. The optical emission was still dominated by the accretion disk, with V ≈ 17 mag. The dominant emission line is Hα, and for the first time we are able to resolve a double-peaked profile, with peak separation Δv = 370 ± 40 km/s. Double-peaked Hα emission lines have been seen in the quiescent optical counterparts of many black-hole X-ray novae. However, we find that the peak separation is significantly smaller in GX 339-4, implying that the optical emission comes from a larger radius than in the novae. The Hα emission line may be more akin to the one in Cygnus X-1, where it is very difficult to determine whether the line is intrinsically double-peaked because absorption and emission lines from the companion star dominate.
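    The step from peak separation to emitting radius is Keplerian: half the separation approximates the projected orbital velocity at the outer edge of the line-emitting region, v sin i ≈ Δv/2, so R ≈ G M sin²(i) / (Δv/2)². The sketch below uses the measured Δv from the abstract but an assumed black-hole mass and inclination, chosen purely for illustration.

        import math

        G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
        M_SUN = 1.989e30   # solar mass, kg

        def emitting_radius(delta_v_kms, mass_msun, inclination_deg):
            # Keplerian velocity at the outer line-emitting radius, taking
            # half the peak separation as the projected orbital velocity.
            v = 0.5 * delta_v_kms * 1e3 / math.sin(math.radians(inclination_deg))
            return G * mass_msun * M_SUN / v**2

        # Delta v = 370 km/s from the abstract; the mass and inclination are
        # assumed values for illustration, not measurements from the paper.
        r = emitting_radius(delta_v_kms=370.0, mass_msun=7.0, inclination_deg=40.0)
        print(f"R ~ {r:.2e} m ({r / 6.96e8:.0f} solar radii)")

    A smaller Δv raises R quadratically, which is the sense in which the smaller peak separation in GX 339-4 implies emission from a larger radius than in the X-ray novae.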